Orchestration Management

When we run our integration tests, the PostgreSQL database engine must already be running in the background. It must also already be configured, for example, with a new database ready to be used. Additionally, when all the tests have been executed, the database should be removed, and the database engine needs to stop running.

This job is well suited for Docker, which can run complex systems in isolation with minimal configuration. Here, we have a choice: either orchestrate the creation and destruction of the database with an external script or implement everything in the test suite. Since the first solution is the one many frameworks tend to use, in this chapter, we'll show an implementation of it.

We've planned to create a management script that spins up and tears down the required containers and runs the tests. The management script can also be used to run the application itself or to create development setups, but in this case, we'll simplify it so that it only manages the tests.

Update the requirements#

The first thing we need to do when we use Docker Compose is to add the requirement to the requirements/test.txt file, as we can see in the code below:

requirements/test.txt
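As a sketch, the new entry could simply be the `docker-compose` package appended to the existing test requirements (the exact package name and any version pin are assumptions; adjust them to your setup):

```text
# requirements/test.txt (sketch; the added entry is an assumption)
docker-compose
```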

Update the management script#

The management script is given below:

manage.py

Code explanation#

Let’s see what the code above does, block by block.

Some Docker containers (like the PostgreSQL container we’ll use shortly) depend on environment variables to perform the initial setup. Therefore, from lines 1 to 19, we define a function to set environment variables if they’re not already initialized. We also describe a few paths to our configuration files.
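A minimal sketch of this first block might look like the following. The helper name `setenv`, the default value, and the path constants are assumptions for illustration, not the script's exact code:

```python
import os

def setenv(variable, default):
    """Initialize an environment variable only if it is not already set."""
    os.environ[variable] = os.getenv(variable, default)

# Default environment name (an assumption for illustration)
setenv("APPLICATION_CONFIG", "development")

# Paths to the configuration files (names are assumptions)
APPLICATION_CONFIG_PATH = "config"
DOCKER_PATH = "docker"
```

Using `os.getenv` with a default means values already exported in the shell always win over the script's defaults.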

From lines 22 to 27, we introduce app_config_file and docker_compose_file, which return the specific file for the environment we are working in. We do this because, in principle, we expect to have a different configuration for development, testing, and production. From lines 29 to 45, we isolate the read_json_configuration function from the configure_app function since it will be imported by the tests to initialize the database repository.
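These helpers might be sketched as follows. The path constants and the JSON layout (a list of `{"name": ..., "value": ...}` entries flattened into a dict) are assumptions for illustration:

```python
import json
import os

APPLICATION_CONFIG_PATH = "config"
DOCKER_PATH = "docker"

def app_config_file(config):
    # e.g. "testing" -> "config/testing.json"
    return os.path.join(APPLICATION_CONFIG_PATH, f"{config}.json")

def docker_compose_file(config):
    # e.g. "testing" -> "docker/testing.yml"
    return os.path.join(DOCKER_PATH, f"{config}.yml")

def read_json_configuration(config):
    # Assumed format: a list of {"name": ..., "value": ...} entries,
    # flattened here into a plain dict for easy lookup
    with open(app_config_file(config)) as f:
        config_data = json.load(f)
    return {entry["name"]: entry["value"] for entry in config_data}
```

Keeping `read_json_configuration` separate from `configure_app` lets the test suite import it directly to initialize the database repository.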

In line 53, we introduce the docker_compose_cmdline function. This simple function builds the Docker Compose command line, which helps us make sure we don't repeat long lists of options whenever we need to orchestrate our containers.
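A sketch of such a function is shown below; the exact signature and the flags included are assumptions for illustration:

```python
import os

DOCKER_PATH = "docker"

def docker_compose_cmdline(config):
    """Build the base Docker Compose command line for an environment.

    Centralizing this list means the long set of options is written
    once instead of being repeated at every call site.
    """
    return [
        "docker-compose",
        "-p", config,                                      # project name
        "-f", os.path.join(DOCKER_PATH, f"{config}.yml"),  # compose file
    ]
```

Callers can then extend the returned list with subcommands such as `up -d`, `logs`, or `down`.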

In line 76, we insert the run_sql function, which allows us to run SQL commands on our PostgreSQL database. This function will also help us when we create the empty test database. In line 94, we introduce the wait_for_logs function, which allows us to monitor the PostgreSQL container and ensure it’s ready to use. Whenever we spin up containers programmatically, we need to be aware that they have a certain start-up time before they’re ready so that we can act accordingly.
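The two helpers might be sketched along these lines, assuming `psycopg2` for the database connection and the `POSTGRES_*` environment variables discussed later in this lesson; the polling interval and details are assumptions:

```python
import subprocess
import time

def run_sql(statements):
    """Run SQL statements against the PostgreSQL server via psycopg2."""
    import os
    import psycopg2

    conn = psycopg2.connect(
        dbname=os.getenv("POSTGRES_DB"),
        user=os.getenv("POSTGRES_USER"),
        password=os.getenv("POSTGRES_PASSWORD"),
        host=os.getenv("POSTGRES_HOSTNAME", "localhost"),
        port=os.getenv("POSTGRES_PORT"),
    )
    # CREATE DATABASE cannot run inside a transaction, so use autocommit
    conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)
    cursor = conn.cursor()
    for statement in statements:
        cursor.execute(statement)
    cursor.close()
    conn.close()

def wait_for_logs(cmdline, message):
    """Poll the container logs until `message` appears."""
    logs = subprocess.check_output(cmdline)
    while message not in logs.decode("utf-8"):
        time.sleep(0.1)
        logs = subprocess.check_output(cmdline)
```

Polling the logs is a simple way to absorb the container's start-up time before issuing the first SQL command.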

Orchestration of different technologies

In line 103, we define the test function, the only command our management script provides. First, we configure the application with the name testing, which means we'll use the config/testing.json configuration file and the docker/testing.yml Docker Compose file. All of these names and paths are simply conventions that come from the arbitrary setup of this management script.

The function then spins up the containers described in the Docker Compose file by running the docker-compose up -d command. It waits for a log message telling us that the database is ready to accept connections, and then runs the SQL command that creates the testing database. Following this, it runs pytest with a default set of options, adding any options we provide on the command line. Finally, it tears down the Docker Compose containers.

Update the Docker Compose configuration#

To complete the setup, we define the following configuration file for Docker Compose, which we can see in the code below:

docker/testing.yml
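As a rough sketch (service name, image tag, and compose version are assumptions for illustration), such a file could look like this:

```yaml
version: '3.4'

services:
  db:
    image: postgres
    environment:
      POSTGRES_DB: ${POSTGRES_DB}
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
    ports:
      # Only the external mapping uses the custom port;
      # inside the container the engine still listens on 5432
      - "${POSTGRES_PORT}:5432"
```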

Adding JSON configuration#

Finally, we add a JSON configuration file, as shown below:

config/testing.json

Let's keep a few points about this configuration in mind. First, it defines the FLASK_ENV and FLASK_CONFIG variables. The former, FLASK_ENV, is an internal Flask variable that can only be defined with the values development or production. It's also connected with the internal debugger. The latter, FLASK_CONFIG, is the variable we use to configure our Flask application with the objects contained in application/config.py.

For testing purposes, we set the FLASK_ENV variable to the value production, since we don’t need the internal debugger. We also set the FLASK_CONFIG variable to the value test, so the application is configured with the TestingConfig class. This class sets the internal Flask parameter TESTING to True.

The rest of the JSON configuration initializes variables whose names start with the POSTGRES_ prefix. These are variables that are required by the PostgreSQL Docker container. When the container runs, it automatically creates a database with a name specified by the POSTGRES_DB variable. It also creates a user with a password, with the values specified in the POSTGRES_USER and POSTGRES_PASSWORD variables.

Finally, we introduce the variable APPLICATION_DB because we want to create a specific database that is not our default. The default port POSTGRES_PORT changes from the standard value 5432 to 5433 so that it doesn’t clash with any database that’s already running on the machine (either natively or containerized). As we can see in the Docker Compose configuration file, this changes only the external mapping of the container. It doesn’t change the actual port the database engine uses inside the container.
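Pulling these points together, a sketch of such a configuration might look as follows; the name/value list format and the concrete values are assumptions for illustration:

```json
[
  {"name": "FLASK_ENV", "value": "production"},
  {"name": "FLASK_CONFIG", "value": "test"},
  {"name": "POSTGRES_DB", "value": "postgres"},
  {"name": "POSTGRES_USER", "value": "postgres"},
  {"name": "POSTGRES_PASSWORD", "value": "postgres"},
  {"name": "POSTGRES_PORT", "value": "5433"},
  {"name": "APPLICATION_DB", "value": "test"}
]
```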

We’re ready to design our tests with all these files in place.

Updated code#

We’ve updated our files with the new code in the code editor.
